Approximating Gaussian Processes with H2-Matrices

Authors

  • Steffen Börm
  • Jochen Garcke
Abstract

To compute the exact solution of Gaussian process regression one needs O(N^3) computations for direct and O(N^2) for iterative methods, since it involves a densely populated kernel matrix of size N×N, where N denotes the number of data points. This makes large-scale learning problems intractable by standard techniques. We propose to use an alternative approach: the kernel matrix is replaced by a data-sparse approximation, called an H^2-matrix. This matrix can be represented by only O(Nm) units of storage, where m is a parameter controlling the accuracy of the approximation, while the computation of the H^2-matrix scales with O(Nm log N). Practical experiments demonstrate that our scheme leads to significant reductions in storage requirements and computing times for large data sets in lower-dimensional spaces.
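The core idea behind such hierarchical approximations is that kernel blocks coupling well-separated groups of points are numerically low rank, so they can be stored as rank-m factors instead of dense blocks. The following is a minimal numpy sketch of that single ingredient (not the authors' H^2-matrix implementation; the kernel, cluster layout, and rank are illustrative assumptions):

```python
import numpy as np

def rbf_kernel(x, y, length=1.0):
    """Squared-exponential kernel on 1-D points (illustrative choice)."""
    return np.exp(-(x[:, None] - y[None, :]) ** 2 / (2.0 * length ** 2))

rng = np.random.default_rng(0)
n, m = 512, 8  # points per cluster, approximation rank

# Two well-separated clusters of 1-D data points.
x_left = np.sort(rng.uniform(0.0, 1.0, n))
x_right = np.sort(rng.uniform(3.0, 4.0, n))

# The kernel block coupling the two clusters is numerically low rank,
# so a truncated SVD of rank m approximates it to high accuracy.
block = rbf_kernel(x_left, x_right)
u, s, vt = np.linalg.svd(block, full_matrices=False)
approx = (u[:, :m] * s[:m]) @ vt[:m]

rel_err = np.linalg.norm(block - approx) / np.linalg.norm(block)
dense_units = n * n        # storage for the dense block
lowrank_units = 2 * n * m  # storage for the rank-m factors
print(f"relative error {rel_err:.2e}, storage ratio {lowrank_units / dense_units:.3f}")
```

A full H^2-matrix applies this compression recursively over a cluster tree and additionally nests the column bases across levels, which is what brings the overall storage down to O(Nm).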


Similar articles

Matrices with banded inverses: Inversion algorithms and factorization of Gauss-Markov processes

The paper considers the inversion of full matrices whose inverses are -banded. We derive a nested inversion algorithm for such matrices. Applied to a tridiagonal matrix, the algorithm provides its explicit inverse as an element-wise product (Hadamard product) of three matrices. When related to Gauss–Markov random processes (GMrp), this result provides a closed-form factored expression for the c...


On Gaussian comparison inequality and its application to spectral analysis of large random matrices

Recently, Chernozhukov, Chetverikov, and Kato [Ann. Statist. 42 (2014) 1564–1597] developed a new Gaussian comparison inequality for approximating the suprema of empirical processes. This paper exploits this technique to devise sharp inference on spectra of large random matrices. In particular, we show that two long-standing problems in random matrix theory can be solved: (i) simple bootstrap i...


Multiresolution Kernel Approximation for Gaussian Process Regression

Figure: (a) In a simple blocked low-rank approximation the diagonal blocks are dense (gray), whereas the off-diagonal blocks are low rank. (b) In an HODLR matrix the low-rank off-diagonal blocks form a hierarchical structure, leading to a much more compact representation. (c) H2 matrices are a refinement of this idea.


Fastfood — Approximating Kernel Expansions in Loglinear Time

Despite their successes, what makes kernel methods difficult to use in many large scale problems is the fact that computing the decision function is typically expensive, especially at prediction time. In this paper, we overcome this difficulty by proposing Fastfood, an approximation that accelerates such computation significantly. Key to Fastfood is the observation that Hadamard matrices when c...
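The observation the snippet refers to is that a Hadamard matrix, combined with random diagonal scaling, can stand in for a dense Gaussian random matrix while being applicable in O(d log d) time via the fast Walsh-Hadamard transform. The sketch below (a hedged illustration of that one ingredient, not the full Fastfood construction, which also uses permutation and further scaling matrices) checks the fast transform against the dense Sylvester Hadamard matrix:

```python
import numpy as np

def sylvester_hadamard(k):
    """Dense Hadamard matrix of order 2**k via Sylvester's recursion."""
    H = np.array([[1.0]])
    for _ in range(k):
        H = np.block([[H, H], [H, -H]])
    return H

def fwht(x):
    """Fast Walsh-Hadamard transform, O(d log d) operations."""
    x = np.asarray(x, dtype=float).copy()
    d = len(x)
    h = 1
    while h < d:
        for i in range(0, d, 2 * h):
            for j in range(i, i + h):
                a, b = x[j], x[j + h]
                x[j], x[j + h] = a + b, a - b
        h *= 2
    return x

rng = np.random.default_rng(0)
k = 6
d = 2 ** k
H = sylvester_hadamard(k)

x = rng.standard_normal(d)
fast = fwht(x)   # same result as the dense product below, but in O(d log d)
dense = H @ x

# One Fastfood-style ingredient: random diagonal scaling followed by the
# Hadamard transform yields a "Gaussian-like" structured projection.
g = rng.standard_normal(d)
y = fwht(g * x) / np.sqrt(d)
```

Because H satisfies H Hᵀ = d·I, the transform is orthogonal up to scaling, which is what makes the structured projection well behaved.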


Complete convergence of moving-average processes under negative dependence sub-Gaussian assumptions

Complete convergence is investigated for moving-average processes of a doubly infinite sequence of negatively dependent sub-Gaussian random variables with zero means, finite variances, and absolutely summable coefficients. As a corollary, the rate of complete convergence is obtained under some suitable conditions on the coefficients.



Journal:

Volume   Issue

Pages  -

Publication year: 2007